Relative Entropy, Probabilistic Inference, and AI
Abstract
Relative entropy, H(q; p) = Σ_i q_i log(q_i / p_i), is an information-theoretic measure of the dissimilarity between q = q_1, …, q_n and p = p_1, …, p_n (H is also called cross-entropy, discrimination information, directed divergence, I-divergence, K-L number, among other terms). Various properties of relative entropy have led to its widespread use in information theory. These properties suggest that relative entropy has a role to play in systems that attempt to perform inference in terms of probability distributions. In this paper, I will review some basic properties of relative entropy as well as its role in probabilistic inference. I will also briefly mention a few existing and potential applications of relative entropy to so-called artificial intelligence (AI).
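As a minimal illustration of the definition above (my own sketch, not from the paper; the function name and the choice of NumPy are assumptions), the sum can be computed directly, using natural logarithms and the usual convention 0 log 0 = 0:

```python
import numpy as np

def relative_entropy(q, p):
    """Relative entropy H(q; p) = sum_i q_i * log(q_i / p_i).

    q, p: probability vectors over the same n outcomes.
    Terms with q_i == 0 contribute 0 (convention 0 log 0 = 0);
    H is infinite if some q_i > 0 where p_i == 0.
    """
    q = np.asarray(q, dtype=float)
    p = np.asarray(p, dtype=float)
    mask = q > 0
    if np.any(p[mask] == 0):
        return np.inf
    return float(np.sum(q[mask] * np.log(q[mask] / p[mask])))

# H(q; p) >= 0, with equality iff q == p:
q = [0.5, 0.3, 0.2]
p = [0.25, 0.25, 0.5]
print(relative_entropy(q, p))   # positive
print(relative_entropy(q, q))   # 0.0
```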
Similar Resources
An Inequality Paradigm for Probabilistic Knowledge
An important aspect of knowledge representation in AI systems is how to represent and reason with probabilistic statements. We observe that a starting set of probabilistic statements, each assigning a unique value to the probability of some sentence (perhaps conditional on some other sentence), in general does not determine a unique value for every sentence of interest. Rather, this probabilist...
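To see why point-valued statements need not pin down every probability of interest, here is a small numeric illustration (mine, not the paper's; the values are assumptions): two statements P(a) = 0.7 and P(b | a) = 0.8 determine P(b and a) exactly, but only bound P(b) to an interval.

```python
# P(b) = P(b and a) + P(b and not-a); the second term is unconstrained
# by the given statements, so it can range over [0, P(not-a)].
p_a = 0.7
p_b_given_a = 0.8

p_b_and_a = p_a * p_b_given_a          # 0.56, fully determined
lower = p_b_and_a                      # attained if P(b | not-a) = 0
upper = p_b_and_a + (1.0 - p_a)        # attained if P(b | not-a) = 1

print(f"P(b) lies in [{lower:.2f}, {upper:.2f}]")   # [0.56, 0.86]
```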
Maximum Entropy and Maximum Probability
Sanov’s Theorem and the Conditional Limit Theorem (CoLT) are established for a multicolor Pólya Eggenberger urn sampling scheme, giving the Pólya divergence and the Pólya extension to the Maximum Relative Entropy (MaxEnt) method. Pólya MaxEnt includes the standard MaxEnt as a special case. The universality of standard MaxEnt advocated by an axiomatic approach to inference for inverse problems i...
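For the standard MaxEnt method mentioned above, a minimal sketch (my own, not from the paper; the dice setting and the mean value 4.5 are assumptions, following Jaynes' well-known example): among distributions on a six-sided die with a fixed mean, the entropy maximizer has exponential form p_i ∝ exp(λ i), and λ can be found numerically from the constraint.

```python
import numpy as np
from scipy.optimize import brentq

faces = np.arange(1, 7)
target_mean = 4.5   # assumed moment constraint

def mean_given_lam(lam):
    # Exponential-family form of the MaxEnt solution for one mean constraint.
    w = np.exp(lam * faces)
    p = w / w.sum()
    return p @ faces

# Solve for the Lagrange multiplier that matches the constraint.
lam = brentq(lambda l: mean_given_lam(l) - target_mean, -5.0, 5.0)
w = np.exp(lam * faces)
p = w / w.sum()
print(np.round(p, 4), p @ faces)   # MaxEnt distribution with mean ~= 4.5
```

This is also the special case of Maximum Relative Entropy with a uniform prior: minimizing H(p; uniform) under the same constraint yields the same distribution.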
Inference of Markov Chain: A Review on Model Comparison, Bayesian Estimation and Rate of Entropy
This article has no abstract.
On Prototypical Indifference and Lifted Inference in Relational Probabilistic Conditional Logic
Semantics for formal models of probabilistic reasoning rely on probability functions that are defined on the interpretations of the underlying classical logic. When this underlying logic is of a relational nature, i.e., a fragment of first-order logic, then the space needed for representing these probability functions explicitly is exponential in both the number of predicates and the number of do...
The Social Entropy Process: Axiomatising the Aggregation of Probabilistic Beliefs
The present work stems from a desire to combine ideas arising from two historically different schemes of probabilistic reasoning, each having its own axiomatic traditions, into a single broader axiomatic framework, capable of providing general new insights into the nature of probabilistic inference in a collective context. In the present sketch of our work we describe briefly the background con...
Publication year: 1985